The Dream - Communications/Core-Based Design

Take heart, EDA tool skeptics, and consider an analyst's vision of hope for design at the system level, grounded in EDA's past, present, and future. By Gary Smith


The road to electronic system level (ESL) design has been neither straight nor level. "Rocky" is the term that comes to mind when describing the ESL design path-just look at the past six years of sales figures (see Figure 1). We're now seeing the introduction of the third generation of ESL tools, and the design community is, in general, fairly skeptical.

The reaction is, "Why are these going to be any different from the last two generations of tools?" That's a good question. Certainly the tools only look like extensions of the last generation's tools; nothing strikingly new is on the market. But there is a difference-that difference is the Dream.

This will be more of a story than an article. Just like ESL, it'll take twists and turns and, no doubt, more than one detour. Hopefully, by the end, it will give you an appreciation of an exciting future-one that will not only bring the EDA industry out of today's gloom but also give it the prominence it has long deserved, while setting off a systems explosion that will change the world.

Sophia Antipolis and U. C. Berkeley

Sophia Antipolis is an idyllic village on the French Riviera, not the sort of place you would consider a hotbed of innovation. Maybe it shouldn't be so surprising-Silicon Valley was, until the 1970s, called the "Valley of the Heart's Delight." Sophia is or has been home to a major Texas Instruments design center, the TIMA think tank, a major VLSI Tech/Compass development center, a major development center for Mentor Graphics, and no doubt many more that I've missed. Some of the more innovative system companies in Europe-notably Alcatel and ST Microelectronics-have been customers for the intellectual stew brewing in Sophia. Just like Silicon Valley, you can't really give all the credit to one company or one individual; in a very real sense, the location has become the innovator. The work there concentrates on a methodology driven by cores and reuse. At Dataquest we've named this methodology "core-based design."

Figure 1 - EDA by Methodology
While RTL design has been the industry's mainstay, gaining 544 percent in revenue over gate-level tools, ESL design is slowly searching for its place.

Six thousand miles away, in another small town, is the second brain trust driving the Dream. This one is no big surprise: it's based at the University of California at Berkeley, long a driver in the electronics world. Today, professors such as Kurt Keutzer, Alberto Sangiovanni-Vincentelli, Richard Newton, Ed Lee, Jan Rabaey, and Dennis Sylvester are developing the methodologies and, in many cases, the new tools needed for tomorrow's design work. One of the primary catalysts was a European automotive development project taken on by Alberto for the Cadence Berkeley Development Labs. The program was called "Felix," and Cadence opened a design center in Rome to work on it. Alberto calls this new methodology "communications-based design," and it has a strong software development element. Keep in mind that ESL design is the concurrent design of both hardware and software; in fact, until you get to the partitioning phase, there is no hardware or software design-it's just logic design. You might have noticed the strong European connection. It's fairly evident that Europe has become the driver for the new ESL design methodologies.

Design styles and methodologies

Actually, the recent ascent of Europe as an electronic system methodology leader should come as no surprise either. By the early 1990s, the three major electronic design regions of the time had settled into three distinct design styles.

Japan had moved to what it called structured RTL (register transfer level) design-really just gate-level design using the newer RTL tools. This is your basic bottom-up design style and, by refusing to move up to true RTL design, the Japanese vendors found themselves dropping further and further behind the U.S. electronic system vendors.

The U.S. was now, once again, the king of the hill.

However, the cell phone vendors were in for a big surprise. What had allowed the U.S. to pass Japan was the RTL design methodology. RTL design made the development of 100,000-plus gate ASICs and ASSPs possible, bringing new levels of complexity and allowing the system vendors to win the competitive war of functionality. The problem with RTL design is that it forces you into a middle-out design style. You may know what you want to design, but after you start the actual design work, you need to determine whether you can build it (verify down). You must also keep checking that what you are designing is what you intended to design in the first place (verify up). This isn't the most efficient design style and, in fact, many of these middle-out designs missed their design targets. The systems were delivered with less than optimal functionality, and the U.S. cell phone manufacturers were jumped by Ericsson and Nokia.

Europe has always been system oriented. Because of this, it spent most of the last half of the century in the shadow of the Americans and the Japanese, who became centered around semiconductors rather than systems. That worked as long as the state of design allowed semiconductor content to dominate the functionality of the system-which it did until the 1990s. The cell phone was the first major electronic product not driven by semiconductor content, which allowed the European vendors to take advantage of their system-level expertise and take the lead in the cell phone market. The basic Ericsson system design is almost ten years old now. The elegance of that design has allowed it to grow with the market, not only producing market-winning cell phones year after year but also allowing Ericsson to spin out derivative designs at three- to four-month intervals.

I'm sure that, if you take a look at Nokia, you'll see the same level of system-design expertise. And this has all been done with little to no automation! Now you have a glimpse of the Dream.

Before we get to the Dream, we need to look at the software development problem. The embedded software development tool (ESDT) industry has stagnated, though certainly not in sales-the industry is growing at a comfortable rate of 13.5 percent. It's the technology that has flattened out.

The ESDT industry is busy consolidating, expanding its present product lines, and building integrated development environments (IDEs) in order to further tool interoperability and raise its average selling prices. This looks extremely familiar to those of us who used EDA tools during the 1980s. Change the term "IDEs" to "frameworks" and you have exactly the strategy that was followed by Daisy, Mentor, and Valid (DMV) just before Synopsys and Cadence took over the EDA market. More on that later.

The software hole

As I stated before, the ESL design methodology really applies to conceptual design at the behavioral level and to the partitioning of the design into hardware and software at the architectural level.

The software problem? There is no software design methodology corresponding to hardware's RTL design flow. You can hand off the hardware portion of the partitioning to the RTL tool set, but there is no corresponding tool flow in the software development environment; you must manually translate the design down to the ESDT tools that correspond to hardware's gate level. We are calling this the "software hole." There is a candidate to solve this problem: the unified modeling language (UML), driven by I-Logix, Telelogic, and Rational. Unlike Verilog and VHDL, however, UML isn't quite up to the task. Semantics standards must be developed and some hierarchical issues need to be resolved. Today, it looks a lot like C/C++ does in ESL design: some further standardization is needed before it's usable across companies, because every system vendor's C/C++ coding style is different enough to prevent sharing of models and tools. Fixing exactly that is what the SystemC initiative is trying to do in the hardware world. Unfortunately, where SystemC has been proceeding at a breakneck pace, UML seems to be moving along at the normal, slow pace typical of the standards development community.
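
To make the model-sharing problem concrete, here is a minimal sketch of the kind of plain-C++ hardware model the SystemC initiative standardizes: a common set of module, port, and data-type conventions, so that a model written at one company can run at another. The eight-bit counter is an invented example, not any vendor's model.

// Minimal SystemC-style module: a clocked 8-bit counter with
// synchronous reset. The point of SystemC is that the macros and
// port types below mean the same thing at every company.
#include <systemc.h>

SC_MODULE(Counter) {
    sc_in<bool> clk;             // clock input
    sc_in<bool> reset;           // synchronous reset
    sc_out<sc_uint<8> > count;   // 8-bit count output

    void tick() {
        if (reset.read())
            count.write(0);
        else
            count.write(count.read() + 1);
    }

    SC_CTOR(Counter) {
        SC_METHOD(tick);         // evaluate tick() ...
        sensitive << clk.pos();  // ... on each rising clock edge
    }
};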

So who will solve the problem? Interestingly enough, the answer could easily come out of the EDA industry. Now, this has been said before, and as far as the ESDT vendors are concerned, it was a paper dragon: in 1995, Cadence, Synopsys, and Mentor were all considering buying an ESDT vendor.

The only acquisition that went through was Mentor's purchase of Microtec, and that acquisition is generally considered a failure. But if you look at the market from the HW/SW co-verification angle, the answer isn't that clear cut. Mentor has been leading that market for a few years now, and if you combine its HW/SW co-verification sales with its ESDT sales, it comes in third in ESDT sales. The question is, why would you add a hardware tool to the ESDT market? The reason is that it's becoming evident that HW/SW co-verification isn't really HW/SW co-verification; it's software verification.

Test and verification are often considered fairly boring topics. Although the details might put you to sleep, the general topic can be very interesting. One of the surprises we found when Daya Nadamuni (Dataquest) began to cover the ESDT market is that, generally speaking, software developers don't verify their designs-they test them.

That could be considered an inflammatory statement until you compare software developers to their counterparts in the hardware design world. Given a preference, the hardware designer wouldn't test his design; he'd only verify it. In essence, the hardware designer and the software developer occupy the two opposite poles of the test/verification spectrum. The reason is fairly simple. On the one hand, the hardware designer is usually the user of the resulting silicon and, more often than not, isn't responsible for manufacturing it. The software developer, on the other hand, is generally the manufacturer of the software.

This is probably a good time for some definitions. The difference between "test" and "verification" is often overlooked-and they aren't synonymous. You test the device, hardware or software, to ensure that the implementation works. In semiconductors, that means the actual silicon doesn't have opens or shorts. In software, it means that the various software modules interact properly and there are no system crashes or lock-ups.

Verification, ideally, checks to see if the hardware or software meets the requirements of the original specification. The user is highly interested in meeting the specification, while the manufacturer is highly interested in a working product-two different considerations. The way the hardware manufacturers solve the problem, at least in the semiconductor world, is to force the design engineer to place test structures in his design.

This is done with design-for-test (DFT) tools. To be honest, the design engineers hate it, but they must play by the rules if they want to get silicon. What we found in the software development world is that the software hole has made verification almost impossible. This is why software developers must wait for the hardware before they can start verification-or, in their world, integration.

In order to allow software verification to start prior to the completion of the hardware, some EDA vendors, primarily Mentor and Synopsys, have come up with HW/SW co-verification tools. This way, software developers can start verifying their software against the RTL. The original idea in HW/SW co-verification was that, at this level, you had the option to change either the hardware or the software, enabling much greater system optimization. The reality of HW/SW co-verification is that the hardware design is, for all practical purposes, never changed; it's the software that's modified. That means the HW/SW co-verification tools are actually ESDT tools, not EDA tools. Not only that, they are RTL ESDT tools, and they are starting to fill the software hole that the ESDT vendors left vacant. It looks like the ESDT vendors have been outflanked while they weren't looking-reminiscent of DMV in the 1980s.
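
To picture why co-verification ends up being software verification, consider this hypothetical sketch (every name in it is invented, not the API of any shipping co-verification product). The embedded software is written against a bus interface; until silicon arrives, a simulated RTL model simply stands in behind that interface, and it's the software, not the hardware, that gets debugged.

#include <cassert>
#include <cstdint>

// The software's only view of the hardware: memory-mapped accesses.
struct Bus {
    virtual void write(uint32_t addr, uint32_t data) = 0;
    virtual uint32_t read(uint32_t addr) = 0;
    virtual ~Bus() {}
};

// Stand-in for the RTL side: a real co-verification tool would
// forward each access to the HDL simulation of the design.
struct SimulatedRtlBus : Bus {
    uint32_t regs[256] = {};
    void write(uint32_t addr, uint32_t data) { regs[addr & 0xff] = data; }
    uint32_t read(uint32_t addr) { return regs[addr & 0xff]; }
};

// The embedded software under verification; it runs unchanged
// whether the Bus is silicon or simulation.
void init_device(Bus& bus) {
    bus.write(0x10, 0x1);          // hypothetical enable register
    assert(bus.read(0x10) == 0x1); // checked long before silicon exists
}

int main() {
    SimulatedRtlBus rtl;
    init_device(rtl);              // software verification at the RTL
}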

Communications/core-based design

At the beginning of this article, we talked about the core-based design methodology coming out of Sophia Antipolis and the communications-based methodology coming out of Berkeley.

So what's the difference? The answer is basically nothing. Communications-based design is a top-down approach to the ESL design challenge. Core-based design is a bottom-up approach to the same problem. And they both have come to the same conclusions.

Communications-based design, as we noted, has more of a software content; as a top-down look at the design challenge, this is to be expected. What's interesting is that the Mentor group in Sophia Antipolis has found that you can place software modules in your core repository just as easily as you can place silicon cores. In essence, think of an RTOS, or a driver, as a core.

We need to take a short detour here to discuss platform-based design, mainly because Cadence, San Jose, uses the phrase "platform-based design" to describe the Berkeley Labs communications-based design project. It isn't platform-based design, and the distinction is important.

Platform-based designs appear whenever the design community moves to a new methodology.

This is my fourth experience with platforms. Platforms allow an engineer to produce a design using a new methodology, in a fixed architecture, targeted at an application. Semiconductor vendors usually develop these platforms so they can sell silicon into an emerging market. Platform-based design typically lasts for two design cycles and is then abandoned by the design community. The reason is simple.

Platform-based design acts, in essence, as training wheels for the designer. After two or three designs, he is comfortable with the design methodology and he discards his training wheels. Why do you discard training wheels? So you can go faster.

In the design world, that means you can explore alternative architectures to give you a performance or a functionality edge on your competition.

Of course, there are claims that hardware is now a commodity and that true functional advantages will be gained through software. That could be a very long discussion in itself, but to keep this succinct, I'll just say that's hogwash. What Cadence, San Jose, is really saying is that VCC (the name given to the Felix tool suite) allows you to develop your own platform. In the design world, that is called an architecture. What makes platform-based design work is the fixed architecture; in other words, the fixed architecture acts as the training wheels.

Once you have a fixed architecture, you can set interface standards that allow components to be plugged into the design. That makes it a fast and fairly easy way of knocking out a new design. The problem is the inflexibility: since you can't modify the architecture, everyone who uses that platform will turn out a design at the same level of performance, offering no competitive advantage.

That means the platform will be abandoned as soon as engineers feel comfortable with the application. They will then be free to modify the architecture to gain the competitive advantage they seek.

Why did RTL methodology take off?

As promised earlier, let's get back to Synopsys and Cadence. Synopsys didn't have the first synthesizer, nor did Gateway (purchased by Cadence) have the first HDL simulator. And just as today, engineers were skeptical and didn't see any breakthrough technology.

Most ASIC vendors saw Design Compiler as a great netlist translator-a way to go out and steal LSI Logic's customers. Fortunately, LSI Logic saw it as something completely different: Synopsys and Cadence were enabling a new methodology that would let the design engineer make a significant leap into a new world of design possibilities. This wasn't just an extension of the gate-level methodology; it was something altogether new. LSI Logic also saw it as a great chance to fill up the increasing gate counts being made available by the new silicon process technologies. You now had three companies, one with considerable influence and reach into the design community, evangelizing the RTL methodology.

Keep in mind, Synopsys and Cadence didn't sell a point tool-they sold a Dream. I don't use the word evangelizing lightly. If you were there at the time, you saw the new religion of RTL design replace the old religion of gate-level design. You also saw the demise of the "old church" of DMV.

So back to the original question: "Why is this new generation of ESL tools going to be any different from the last two? After all, they're just extensions of the old RTL tools."

First, we are starting to see the "old religion" crumble. The consolidation of the RTL tools, and their combination with IC layout tools to create the new implementation tool set, will completely change the industry. Second, we once again need a new methodology to fill up the increasing gate counts made possible by the new silicon process technologies.

Sound familiar? The real difference, though, is the Dream. Once you have a Dream, things happen. Let's look at the ESL Dream.

The Dream

When a systems architect begins a new design, he has four to six possible architectures in mind. Yes, we checked. Keep in mind that, today, this is a manual process. Now, what if he were given a database and a search engine? What if he were building an electronic toaster and searched for "electronic toaster"? If the database were something like the repository concept that has come out of Mentor Sophia Antipolis, and if the search engine did what Mentor and Synchronicity believe a search engine should do, you would get some surprising results.

Figure 2 - The Dream
The methodology starts the Dream; then the tools that color the Dream must appear to turn it into reality.

Once you have automated the architectural search, you will get 40 to 60 possible architectures. That brings a lot to the party! First of all, you can use the new tools being developed by Cadence and Synopsys to do what-if analysis and determine the optimum architecture. But that's just the start. You also have enough variants of that architecture to build a product roadmap. Additionally, you can set up your architecture to allow you to spin out derivative designs. Sounds like what's happening in the cell phone market, doesn't it? Last but not least, you can take a look at the fringe architectures-those furthest from the design goal-and, with some luck, come up with ideas for three or four related product lines.

Oh, and by the way, the repository will tell you which cores you have on hand (so you can build each suggested architecture) and which cores will have to be acquired to complete your design. This lets you determine how long it will take to complete the design phase for each of the proposed architectures-keep in mind that time-to-market is a major part of the design specification. You feed this information to the new tools coming out of Berkeley and you complete the design. After that, the design is handed off to the RTL implementation team, as the RTL will no longer be the design space.
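
For the curious, here is a hypothetical sketch of the automated architectural search the Dream describes; every class and number below is invented for illustration, since no such tool exists today. The repository search returns candidate architectures, each candidate lists the cores it needs, and the missing cores drive a time-to-market estimate that ranks the candidates.

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct Core {
    std::string name;
    bool onHand;        // already in the repository?
    int weeksToAcquire; // estimated time to buy or build it
};

struct Architecture {
    std::string name;
    std::vector<Core> cores;

    // Crude time-to-market estimate: the longest wait among
    // the cores not yet on hand.
    int weeksToComplete() const {
        int worst = 0;
        for (const Core& c : cores)
            if (!c.onHand)
                worst = std::max(worst, c.weeksToAcquire);
        return worst;
    }
};

int main() {
    // Stand-in for the repository search: "electronic toaster"
    // returns two candidates here instead of 40 to 60.
    std::vector<Architecture> candidates = {
        {"toaster-a", {{"8-bit MCU", true, 0}, {"heater ctrl", false, 8}}},
        {"toaster-b", {{"DSP core", false, 12}, {"heater ctrl", false, 8}}},
    };

    // Rank candidates by estimated design-phase completion time.
    std::sort(candidates.begin(), candidates.end(),
              [](const Architecture& a, const Architecture& b) {
                  return a.weeksToComplete() < b.weeksToComplete();
              });

    for (const Architecture& a : candidates)
        std::cout << a.name << ": design phase ready in "
                  << a.weeksToComplete() << " weeks\n";
}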

So there you have the ESL Dream (see Figure 2). The new tools are just new tools until you know how to use them. The Dream is the idea that develops the methodology. The new tools then start to fill out the tool flow needed to implement that methodology. And the end result will be an explosion of systems designs that will change the world.


Gary Smith is chief analyst for the EDA service in Dataquest's software group. He is responsible for all research, publications, and client projects relating to the EDA marketplace. He is also involved in research and consulting projects on the emerging methodologies in RTL and core-based design.
